Gesture and register control refactor #38

Closed
wants to merge 8 commits into from

Conversation

@fivesixzero commented Nov 20, 2021

Closed in favor of proceeding with #39, which implements similar changes much more efficiently

Note: This includes the changes in #37 as well.

If anyone wants to follow my trail down the rabbit hole with this sensor, I've been keeping updated notes, including a Jupyter Notebook illustrating processing of gesture data.

https://github.com/fivesixzero/circuitpython-experiments/tree/main/clue-nrf52/apds9960-testing

Apologies for the big PR here. While testing my prior PRs I ran into a bunch of unexpected behavior in the driver/sensor that led me to dive in for over a week. It's been a useful learning exercise and I'm hoping that the dozens of hours I've sunk into getting to know this sensor can be useful to the community!

With that in mind, I've refactored the gesture engine to address several problems in the current driver code.

Here are the primary issues this PR is aiming to address:

  1. Current code returns very unpredictable gesture results
    • No issue logged, but this seems to be the consensus when I've asked other devs
  2. The gesture engine loops continuously after a gesture() call, effectively locking up the sensor until power cycle
  3. Data read from the gesture FIFOs doesn't consider overflow state or the possibility we may be reading in data mid-stream
    • No issue logged, but this is a core contributor to the above item and basically prohibits generation of deterministic results (a rough sketch of a bounded, overflow-aware read appears after this list)
  4. Register configuration is incomplete and largely undocumented with some confusion-inducing inaccuracies
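
Since items 2 and 3 are about the read loop itself, here's a rough sketch of the kind of bounded, overflow-aware FIFO drain this PR moves toward. The register addresses (GSTATUS 0xAF, GFLVL 0xAE, GFIFO_U starting at 0xFC) come from the APDS-9960 datasheet, but the `_read8`/`_read_fifo` helpers, the timeout value, and the discard-on-overflow policy are hypothetical stand-ins, not the code in this PR:

```python
import time

GESTURE_TIMEOUT = 0.3  # hypothetical cap so a read can never spin forever


def collect_gesture_frames(dev):
    """Drain the gesture FIFO into a list of (up, down, left, right) frames."""
    frames = []
    deadline = time.monotonic() + GESTURE_TIMEOUT
    while time.monotonic() < deadline:
        gstatus = dev._read8(0xAF)             # GSTATUS
        if gstatus & 0x02:                     # GFOV: FIFO overflowed, frames were lost
            frames.clear()                     # discard the now-nondeterministic run
        if not gstatus & 0x01:                 # GVALID: no complete dataset waiting
            break
        level = dev._read8(0xAE)               # GFLVL: count of 4-byte UDLR datasets
        raw = dev._read_fifo(0xFC, level * 4)  # burst read starting at GFIFO_U
        frames.extend(tuple(raw[i:i + 4]) for i in range(0, len(raw), 4))
    return frames
```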

In addition to addressing the above, this PR includes improvements in a few other areas.

  • Provide well-documented properties for user control and tuning of important config registers
    • This sensor is really complicated! The datasheet is thorough, but getting a full understanding of how the sensor works at a low level has required a lot of experimentation.
    • The config registers can have some pretty counterintuitive effects on real-world sensor behavior. The bulk of trial/error experimentation time has been spent on understanding (and documenting) the important configs more clearly.
  • Update examples to illustrate some effective ways to use this rather complex sensor with CircuitPython
    • This sensor can be configured in a ton of different ways, so it makes sense to call out a few interesting low-code use cases in examples, if that's appropriate
    • So far I've only added a proximity-triggered keypad example but, with proper configuration, this sensor can be used for a bunch of other interesting things without needing to grok the internals (a stripped-down proximity loop is sketched after this list)
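
For context, the low-code use case I mean looks roughly like the loop below. This is a stripped-down sketch rather than the keypad example itself, and it assumes the property-style `enable_proximity`/`proximity` API; whether `proximity` is a property or a method has varied between driver versions, so adjust to match the release you're on. The threshold is just an illustrative value:

```python
import time
import board
from adafruit_apds9960.apds9960 import APDS9960

i2c = board.I2C()
sensor = APDS9960(i2c)
sensor.enable_proximity = True

PROXIMITY_THRESHOLD = 50  # illustrative only; raw readings range 0-255

while True:
    # Fire whenever something gets close enough to push the reading past the threshold
    if sensor.proximity > PROXIMITY_THRESHOLD:
        print("Proximity triggered:", sensor.proximity)
    time.sleep(0.1)
```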

There are still some items to discuss and address before this can be moved out of draft state. I'm hoping reviewers can add to this list as well. :)

  • File size and memory optimizations are critically important before this is ready for prime time
    • This is the primary blocker, since it'll break the Proximity Trinkey firmware build (which is the only board that freezes this driver in) and generally prohibits this driver's use on memory-constrained platforms like SAMD21.
    • Using only the proximity engine requires very little setup or driver code, with the color engine increasing complexity quite a bit and the gesture engine increasing complexity and code size substantially.
    • It might be useful to split things out into minimal-footprint basic (ie, proximity only) and advanced (color, gesture) classes/subclasses, similar to the BME280 driver
  • Gesture data processing is substantially improved but still very simplistic
    • The data returned from the sensor's gesture engine via the improved data acquisition code offers us a lot of processing options! With more powerful platforms like RP2040 we could do some cool things to enhance this processing or at least offer the raw gesture data frames to driver users to handle as they please.
    • This was fun to play with in a Jupyter Notebook with some graphing to illustrate some simple 'signal' vs 'noise' processing (a naive sketch of that kind of processing follows this list)
  • Color/light engine & register control improvements are still in progress
    • I started with proximity and gesture enhancements since they looked like the most problematic portions of the driver, but color detection could definitely use some attention as well
    • This is another area where returning basic values is easy but we can offer driver users a lot more useful/consistent output with floating-point math and other post-acquisition processing
  • Final pass on all docstrings for formatting and accuracy
    • Making these useful and accurate is a big part of this effort so it'll be important to make a final, comprehensive pass before un-drafting.
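
To make the 'signal' vs 'noise' idea above concrete, here's a naive sketch of the sort of post-processing the raw frames allow. It assumes each frame is an (up, down, left, right) tuple of 8-bit values; the noise floor and both helpers are illustrative, not part of this PR:

```python
NOISE_FLOOR = 30  # illustrative: ignore frames whose brightest photodiode stays below this


def signal_frames(frames):
    """Keep only the frames whose peak photodiode value rises above the noise floor."""
    return [frame for frame in frames if max(frame) > NOISE_FLOOR]


def peak_order(frames):
    """Frame index at which each photodiode (U, D, L, R) hit its peak value.

    The photodiode that peaks earliest roughly indicates the edge the hand
    entered from, which is one very simple way to turn frames into a direction.
    """
    peaks = [0, 0, 0, 0]
    best = [-1, -1, -1, -1]
    for index, frame in enumerate(frames):
        for channel, value in enumerate(frame):
            if value > best[channel]:
                best[channel] = value
                peaks[channel] = index
    return peaks
```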

@tannewt requested a review from a team November 22, 2021 18:51
@fivesixzero (Author)

All of the above tasks were completed on a clean branch, so this draft PR is effectively dead. It was way easier to stack changes one at a time rather than work backwards from this massive prospective refactor. :)

The new PR should be up shortly.
